A Model of Recurrent Networks that Learn the Finite Automaton from Given Input-Output Sequences
Author
Abstract
From the viewpoint of applying recurrent neural networks to AI, learning algorithms based on optimal control theory are not well suited. One reason is the lack of correspondence between these algorithms and traditional symbol processing. In this report, we propose a new algorithm for modified Elman networks (the PEX model). The algorithm is derived from the minimization procedure for finite automata, exploiting the correspondence between Elman networks and finite automata. In this algorithm, the network predicts its own next state, and this prediction plays an important role in refining the network. In addition, we examined the model's behavior under noise and found that the network acquires state transitions that are robust to noise. Finally, we discuss the possibility of handling context-free grammars (CFGs) through an experiment on sub-grammar learning.
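The correspondence between Elman networks and finite automata that the abstract relies on can be illustrated with a minimal sketch: the context (hidden) layer plays the role of the automaton state, and hard-thresholding the continuous activations discretizes the state space into a finite set. The weights, sizes, and thresholding rule below are illustrative assumptions, not the paper's actual PEX model.

```python
import numpy as np

# Minimal Elman-style cell (illustrative, not the paper's PEX model).
# The hidden/context vector acts as the automaton state; thresholding
# its activations yields a discrete state label.
rng = np.random.default_rng(0)

n_in, n_hid = 2, 4
W_in = rng.normal(size=(n_hid, n_in))   # input-to-hidden weights
W_rec = rng.normal(size=(n_hid, n_hid)) # recurrent (context) weights

def elman_step(h, x):
    """One state transition: new hidden state from input and previous context."""
    return np.tanh(W_in @ x + W_rec @ h)

def discretize(h):
    """Map the continuous hidden state to a discrete automaton state label."""
    return tuple((h > 0).astype(int))

# Run a binary input sequence and record the induced state trajectory.
h = np.zeros(n_hid)
states = [discretize(h)]
for symbol in [0, 1, 1, 0]:
    x = np.eye(n_in)[symbol]  # one-hot encoding of the input symbol
    h = elman_step(h, x)
    states.append(discretize(h))

print(states)  # a finite sequence of discrete states
```

Under this view, collecting the transitions between discretized states over many input strings yields a finite-state transition table, which is the object the paper's minimization-based refinement operates on.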
Similar resources
Introducing deep modular neural networks with a dual spatio-temporal structure to improve continuous Persian speech recognition
In this article, growable deep modular neural networks for continuous speech recognition are introduced. These networks can be grown to implement the spatio-temporal information of the frame sequences at their input layer as well as their labels at the output layer at the same time. The trained neural network with such double spatio-temporal association structure can learn the phonetic sequence...
Efficient encodings of finite automata in discrete-time recurrent neural networks
A number of researchers have used discrete-time recurrent neural nets (DTRNN) to learn finite-state machines (FSM) from samples of input and output strings; trained DTRNN usually show FSM behaviour for strings up to a certain length, but not beyond; this is usually called instability. Other authors have shown that DTRNN may actually behave as FSM for strings of any length and have devised strate...
Learning a deterministic finite automaton with a recurrent neural network
We consider the problem of learning a finite automaton with recurrent neural networks, given a training set of sentences in a language. We train Elman recurrent neural networks on the prediction task and study experimentally what these networks learn. We found that the network tends to encode an approximation of the minimum automaton that accepts only the sentences in the training set.
DISTINGUISHABILITY AND COMPLETENESS OF CRISP DETERMINISTIC FUZZY AUTOMATA
In this paper, we introduce and study notions like state-distinguishability, input-distinguishability and output completeness of states of a crisp deterministic fuzzy automaton. We show that for each crisp deterministic fuzzy automaton there corresponds a unique (up to isomorphism), equivalent distinguished crisp deterministic fuzzy automaton. Finally, we introduce two axioms related...
Knowledge Extraction and Recurrent Neural Networks: An Analysis of an Elman Network trained on a Natural Language Learning Task
We present results of experiments with Elman recurrent neural networks (Elman, 1990) trained on a natural language processing task. The task was to learn sequences of word categories in a text derived from a primary school reader. The grammar induced by the network was made explicit by cluster analysis which revealed both the representations formed during learning and enabled the construction o...
Publication date: 1992